gradient
Go back to the [[AI Glossary]]
The vector of partial derivatives of a function with respect to all of its independent variables. In machine learning, the gradient is the vector of partial derivatives of the model function; it points in the direction of steepest ascent.
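As a small worked example (notation added here, not part of the original glossary entry): for a differentiable function of several variables, the gradient stacks all of the first-order partial derivatives into one vector.

```latex
% Gradient of a scalar function f of n variables.
\nabla f(x_1, \dots, x_n) =
  \left( \frac{\partial f}{\partial x_1}, \dots, \frac{\partial f}{\partial x_n} \right)

% Example: f(w, b) = w^2 + 3b  gives  \nabla f = (2w, 3).
```

Moving a small step against the gradient decreases the function's value fastest, which is what gradient descent (below) exploits.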
gradient clipping
Go back to the [[AI Glossary]]
#seq
A commonly used mechanism to mitigate the exploding gradient problem by artificially limiting (clipping) the maximum value of gradients when using gradient descent to train a model.
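A minimal sketch of the two common flavors in Python/NumPy (the function names and thresholds are illustrative, not from the glossary): clip each gradient component to a maximum magnitude, or rescale the whole gradient when its norm exceeds a threshold.

```python
import numpy as np

def clip_by_value(grad, clip_value=1.0):
    """Clip each gradient component to the range [-clip_value, clip_value]."""
    return np.clip(grad, -clip_value, clip_value)

def clip_by_norm(grad, max_norm=1.0):
    """Rescale the gradient so its L2 norm does not exceed max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

# Example: an "exploding" gradient is tamed before the update step.
g = np.array([250.0, -80.0, 3.0])
print(clip_by_value(g))  # [ 1. -1.  1.]
print(clip_by_norm(g))   # same direction as g, but with norm 1.0
```

Norm clipping preserves the gradient's direction, while value clipping can change it; most deep learning frameworks offer both.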
gradient descent
Go back to the [[AI Glossary]]
A technique to minimize loss by computing the gradients of the loss with respect to the model's parameters, conditioned on training data. Informally, gradient descent iteratively adjusts parameters, gradually finding the best combination of weights and bias to minimize loss.
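A minimal sketch in Python/NumPy, fitting a linear model with squared loss (the learning rate, toy data, and function name are illustrative assumptions, not part of the glossary entry):

```python
import numpy as np

def gradient_descent_step(w, b, X, y, learning_rate=0.01):
    """One gradient descent update for a linear model y_hat = X @ w + b
    under mean squared error loss."""
    y_hat = X @ w + b
    error = y_hat - y
    # Gradients of the MSE loss with respect to w and b.
    grad_w = 2.0 * X.T @ error / len(y)
    grad_b = 2.0 * error.mean()
    # Step against the gradient: the direction of steepest descent.
    return w - learning_rate * grad_w, b - learning_rate * grad_b

# Toy data generated from y = 3x + 1 plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=100)

w, b = np.zeros(1), 0.0
for _ in range(500):
    w, b = gradient_descent_step(w, b, X, y)
print(w, b)  # should end up close to [3.] and 1.0
```

Each step moves the parameters a little way against the gradient; the learning rate controls the step size.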